
    Airborne LiDAR for DEM generation: some critical issues

    Airborne LiDAR is one of the most effective and reliable means of terrain data collection. Using LiDAR data for DEM generation is becoming standard practice in spatially related fields. However, effective processing of the raw LiDAR data and the generation of an efficient, high-quality DEM remain major challenges. This paper reviews recent advances in airborne LiDAR systems and the use of LiDAR data for DEM generation, with special focus on LiDAR data filters, interpolation methods, DEM resolution, and LiDAR data reduction. Separating LiDAR points into ground and non-ground points is the most critical and difficult step in DEM generation from LiDAR data. Commonly used and recently developed LiDAR filtering methods are presented. Interpolation methods and the choice of a suitable interpolator and DEM resolution for LiDAR DEM generation are discussed in detail. To reduce data redundancy and increase efficiency in terms of storage and manipulation, LiDAR data reduction is required in the process of DEM generation. Feature-specific elements such as breaklines contribute significantly to DEM quality; data reduction should therefore be conducted so that critical elements are kept while less important elements are removed. Given the high-density characteristic of LiDAR data, breaklines can be extracted directly from LiDAR data. Extraction of breaklines and their integration into DEM generation are presented.
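
    As a rough illustration of the two processing steps the abstract highlights, ground/non-ground separation followed by interpolation, the Python sketch below applies a simple grid-minimum filter and inverse-distance weighting to synthetic points. The cell size, height tolerance and IDW power are illustrative assumptions, not any of the filters or interpolators reviewed in the paper.

```python
# A minimal sketch of a LiDAR-to-DEM pipeline: crude ground filtering,
# then interpolation of the retained ground points onto a DEM grid.
import numpy as np

def grid_minimum_filter(points, cell=5.0, tol=0.5):
    """Keep points within `tol` metres of the lowest return in each grid cell.
    `points` is an (N, 3) array of x, y, z LiDAR returns."""
    keys = np.floor(points[:, :2] / cell).astype(int)
    ground = np.zeros(len(points), dtype=bool)
    for key in {tuple(k) for k in keys}:
        idx = np.where((keys == key).all(axis=1))[0]
        zmin = points[idx, 2].min()
        ground[idx] = points[idx, 2] <= zmin + tol
    return points[ground]

def idw_dem(ground, xi, yi, power=2.0, eps=1e-6):
    """Inverse-distance-weighted interpolation of ground points to grid nodes."""
    dem = np.empty((len(yi), len(xi)))
    for r, y in enumerate(yi):
        for c, x in enumerate(xi):
            d = np.hypot(ground[:, 0] - x, ground[:, 1] - y) + eps
            w = 1.0 / d**power
            dem[r, c] = np.sum(w * ground[:, 2]) / np.sum(w)
    return dem

# Example: 1,000 synthetic returns over a 100 m x 100 m tile.
pts = np.column_stack([np.random.rand(1000) * 100,
                       np.random.rand(1000) * 100,
                       np.random.rand(1000) * 10])
ground = grid_minimum_filter(pts)
dem = idw_dem(ground, xi=np.arange(0, 100, 5.0), yi=np.arange(0, 100, 5.0))
```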

    Compressive Network Analysis

    Modern data acquisition routinely produces massive amounts of network data. Though many methods and models have been proposed to analyze such data, research on network data is largely disconnected from the classical theory of statistical learning and signal processing. In this paper, we present a new framework for modeling network data, which connects two seemingly different areas: network data analysis and compressed sensing. From a nonparametric perspective, we model an observed network using a large dictionary. In particular, we consider the network clique detection problem and show connections between our formulation and a new algebraic tool, namely Radon basis pursuit in homogeneous spaces. Such a connection allows us to identify rigorous recovery conditions for clique detection problems. Though this paper is mainly conceptual, we also develop practical approximation algorithms for solving empirical problems and demonstrate their usefulness on real-world datasets.
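
    To make the dictionary idea concrete, the sketch below poses a toy clique-detection problem as generic basis pursuit (L1 minimisation via linear programming): the observed edge weights are modelled as a sparse combination of "clique atoms". The size-3/4 clique dictionary and the planted cliques are hypothetical choices; this is not the paper's Radon basis pursuit construction.

```python
# Toy basis pursuit for clique detection on a 6-vertex graph.
import itertools
import numpy as np
from scipy.optimize import linprog

n = 6
edges = list(itertools.combinations(range(n), 2))

def clique_atom(members):
    """Edge-indicator vector of the clique on `members`."""
    return np.array([1.0 if u in members and v in members else 0.0
                     for u, v in edges])

# Dictionary: one atom per candidate clique of size 3 or 4 (hypothetical choice).
cliques = [c for k in (3, 4) for c in itertools.combinations(range(n), k)]
A = np.column_stack([clique_atom(c) for c in cliques])

# Observed network: superposition of two planted cliques.
b = clique_atom((0, 1, 2)) + clique_atom((2, 3, 4, 5))

# Basis pursuit (min ||x||_1 s.t. Ax = b) as an LP, splitting x into +/- parts.
m = A.shape[1]
res = linprog(c=np.ones(2 * m),
              A_eq=np.hstack([A, -A]), b_eq=b,
              bounds=[(0, None)] * (2 * m))
x = res.x[:m] - res.x[m:]
recovered = [cliques[i] for i in np.flatnonzero(x > 1e-6)]
print(recovered)   # ideally the two planted cliques for this toy instance
```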

    VGI and crowdsourced data credibility analysis using spam email detection techniques

    Volunteered geographic information (VGI) can be considered a subset of crowdsourced data (CSD), and its popularity has recently increased in a number of application areas. Disaster management is one of its key application areas, in which the benefits of VGI and CSD are potentially very high. However, quality issues such as credibility, reliability and relevance limit many of the advantages of utilising CSD. Credibility issues arise because CSD come from a variety of heterogeneous sources, including both professionals and untrained citizens. VGI and CSD are also highly unstructured, and their quality and metadata are often undocumented. In the 2011 Australian floods, the general public and disaster management administrators used the Ushahidi Crowd-mapping platform to extensively communicate flood-related information including hazards, evacuations, emergency services, road closures and property damage. This study assessed the credibility of the Australian Broadcasting Corporation’s Ushahidi CrowdMap dataset using a Naïve Bayesian network approach based on models commonly used in spam email detection systems. The results reveal that the spam email detection approach is potentially useful for CSD credibility detection, achieving an accuracy of over 90% using a forced classification methodology.
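
    A minimal sketch of the spam-detection-style Naive Bayes classification described above, assuming scikit-learn. The toy messages and labels are hypothetical and are not the ABC Ushahidi CrowdMap dataset used in the study.

```python
# Naive Bayes text classification of crowdsourced reports, spam-filter style.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

reports = [
    "Bridge on Main St flooded, avoid the area",            # credible
    "Evacuation centre open at the showgrounds",            # credible
    "CLICK HERE to win flood relief money!!!",               # not credible
    "Water rising near the creek, road closed",              # credible
    "Free iPhone for flood victims, send your details",      # not credible
]
labels = ["credible", "credible", "spam", "credible", "spam"]

model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(reports, labels)

print(model.predict(["Road closed due to flood water near the bridge"]))
print(model.predict(["Send bank details to claim flood compensation"]))
```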

    Flood inundation mapping using hydraulic modelling and GIS: a case study in the West Creek sub-catchment

    In recent years, climate change has caused extreme climate conditions, intensifying rainfall and increasing flooding in many regions of the world. The recent floods in Queensland, Australia provide evidence of the effects of climate change on the state and its population. The flash flood that occurred on the 10th of January 2011 in the West Creek catchment in the City of Toowoomba was a sudden and unexpected event, making it difficult to implement flood mitigation and prevention measures. To reduce the impact of flood damage, this study aimed to develop an improved flood inundation model for part of the West Creek catchment using geographic information systems (GIS) and the HEC-RAS hydraulic model. A digital elevation model (DEM) derived from LiDAR data was the primary data source for flood modelling. Geometric data (e.g. stream centreline, banks, flow path centrelines and cross-sections) were extracted from the DEM and used in the analysis. A high-resolution satellite image was used to classify land cover, and roughness coefficients were assigned according to the different land cover types. Field measurements, including surveys of culverts and stream cross-sections, were also conducted to support the modelling process. The result was a flood inundation map that clearly shows the spatial extent of the flooded area along part of West Creek and lower-elevation areas within the catchment.
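
    The GIS side of this workflow can be illustrated with a short sketch that assigns Manning's roughness coefficients by land-cover class and flags DEM cells lying below a modelled water-surface elevation. The land-cover classes, roughness values and elevations are illustrative assumptions; the hydraulic computation itself is performed in HEC-RAS and is not reproduced here.

```python
# Roughness assignment by land cover and a simple DEM-based inundation mask.
import numpy as np

MANNING_N = {          # hypothetical lookup: land-cover class -> Manning's n
    "open_water": 0.030,
    "grass": 0.035,
    "urban": 0.015,
    "dense_vegetation": 0.100,
}

def roughness_grid(landcover, lookup=MANNING_N):
    """Map a 2-D array of land-cover labels to Manning's n values."""
    return np.vectorize(lookup.get)(landcover)

def inundation_mask(dem, water_surface_elevation):
    """Flag cells whose ground elevation sits below the water surface."""
    return dem < water_surface_elevation

# Toy 4 x 4 example (elevations in metres).
dem = np.array([[12.0, 11.5, 11.0, 10.5],
                [11.8, 11.2, 10.6, 10.2],
                [11.5, 10.9, 10.3,  9.8],
                [11.2, 10.5,  9.9,  9.5]])
landcover = np.array([["urban", "urban", "grass", "grass"],
                      ["urban", "grass", "grass", "open_water"],
                      ["grass", "grass", "open_water", "open_water"],
                      ["grass", "open_water", "open_water", "open_water"]])
print(roughness_grid(landcover))
print(inundation_mask(dem, water_surface_elevation=10.7))
```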

    Sparse Approximate Multifrontal Factorization with Butterfly Compression for High Frequency Wave Equations

    We present a fast and approximate multifrontal solver for large-scale sparse linear systems arising from finite-difference, finite-volume or finite-element discretization of high-frequency wave equations. The proposed solver leverages the butterfly algorithm and its hierarchical matrix extension for compressing and factorizing large frontal matrices via graph-distance-guided entry evaluation or randomized matrix-vector multiplication-based schemes. Complexity analysis and numerical experiments demonstrate O(N log² N) computation and O(N) memory complexity when applied to an N × N sparse system arising from 3D high-frequency Helmholtz and Maxwell problems.
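
    The butterfly and hierarchical-matrix machinery is well beyond a short snippet, but the underlying observation, that certain dense (frontal) blocks admit compressed factorizations, can be illustrated with a truncated SVD of a smooth off-diagonal interaction block. This is a conceptual stand-in only: butterfly compression specifically targets the oscillatory, high-frequency blocks for which plain low-rank truncation like this no longer works well. The kernel and tolerance below are illustrative assumptions.

```python
# Low-rank compression of a dense block via truncated SVD (conceptual only).
import numpy as np

def compress_block(block, tol=1e-8):
    """Return factors U, V with block ~= U @ V, keeping singular values
    larger than tol times the largest one."""
    U, s, Vt = np.linalg.svd(block, full_matrices=False)
    rank = int(np.sum(s > tol * s[0]))
    return U[:, :rank] * s[:rank], Vt[:rank, :]

# A smooth interaction block between two well-separated point clusters is
# numerically low rank even though it is stored densely.
x = np.linspace(0.0, 1.0, 200)
y = np.linspace(5.0, 6.0, 200)            # well separated from x
block = 1.0 / np.abs(x[:, None] - y[None, :])

U, V = compress_block(block)
print(block.shape, "->", U.shape, V.shape)                       # rank << 200
print(np.linalg.norm(block - U @ V) / np.linalg.norm(block))     # small error
```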

    Re-Expression of AKAP12 Inhibits Progression and Metastasis Potential of Colorectal Carcinoma In Vivo and In Vitro

    Background: AKAP12/Gravin (A-kinase anchor protein 12) is one of the A-kinase scaffold proteins and a potential tumor suppressor gene in human primary cancers. Our recent study demonstrated highly recurrent loss of AKAP12 in colorectal cancer and showed that AKAP12 re-expression inhibited proliferation and anchorage-independent growth in colorectal cancer cells, implicating AKAP12 in colorectal cancer pathogenesis. Methods: To evaluate the effect of this gene on the progression and metastasis of colorectal cancer, we examined the impact of overexpressing AKAP12 in the AKAP12-negative human colorectal cancer cell line LoVo, comparing a single stable clone (LoVo-AKAP12) with mock-transfected cells (LoVo-CON). Results: pCMV6-AKAP12-mediated AKAP12 re-expression induced apoptosis (3% to 12.7%, p<0.01) and inhibited migration (89.6±7.5 cells to 31.0±4.1 cells, p<0.01) and invasion (82.7±5.2 cells to 24.7±3.3 cells, p<0.01) of LoVo cells in vitro compared to control cells. Nude mice injected with LoVo-AKAP12 cells had both significantly reduced tumor volume (p<0.01) and increased apoptosis compared to mice given LoVo-CON cells. Quantitative human-specific Alu PCR analysis showed that overexpression of AKAP12 suppressed the number of intravasated cells in vivo (p<0.01). Conclusion: These results demonstrate that AKAP12 may play an important role in tumor growth suppression and survival in human colorectal cancer.

    Hybrid composites of silica glass fibre/nano-hydroxyapatite/polylactic acid for medical application

    Fibre-reinforced composites (FRC) have shown great potential for internal bone fixation because their mechanical properties are similar to those of human cortical bone. In this study, ternary composites of silica glass fibres, nano-hydroxyapatite (n-HA) and polylactic acid (PLA) were prepared by compression moulding and their mechanical properties were characterized. With the volumetric content of glass fibre held constant at 30% and the volume fraction of n-HA increased from 0% to 5%, the flexural strength of the composites decreased from 625.68 MPa to 206.55 MPa, whereas the flexural modulus gradually increased from 11.01 GPa to 14.08 GPa. Within a 28-day degradation period, the flexural strength decreased by 30%, while no obvious trend in modulus variation was found. The flexural properties of all composites prepared in this study were close to previously reported values. As more n-HA was incorporated, water absorption increased, whereas negligible mass loss was recorded. SEM images revealed poor impregnation of the fibre mats, as loose fibres were observed; this should be addressed in future research to further improve the mechanical properties and the endurance against degradation.

    Semantic location extraction from crowdsourced data

    Crowdsourced data (CSD) has recently received increased attention in many application areas, including disaster management. Convenience of production and use, data currency and abundance are some of the key reasons for this high interest. Conversely, quality issues such as incompleteness, credibility and relevance prevent the direct use of such data in important applications like disaster management. Moreover, the availability of location information in CSD is problematic, as it remains very low on many crowdsourcing platforms such as Twitter. The recorded location mostly relates to the mobile device or user location and often does not represent the event location. In CSD, the event location is instead discussed descriptively in the comments, in addition to the recorded location generated by the mobile device's GPS or the mobile communication network. This study attempts to semantically extract CSD location information with the help of an ontological gazetteer and other available resources. Tweets and Ushahidi CrowdMap data from the 2011 Queensland floods were semantically analysed to extract location information with the support of the Queensland Gazetteer, which was converted into an ontological gazetteer, and a global gazetteer. Preliminary results show that the use of ontologies and semantics can improve the accuracy of place-name identification in CSD and the process of location information extraction.
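
    A minimal sketch of gazetteer-based place-name matching on report text. The gazetteer entries, coordinates and example tweet are illustrative assumptions, not drawn from the Queensland Gazetteer or the ontology used in the study.

```python
# Match word n-grams in report text against a small place-name gazetteer.
GAZETTEER = {
    "toowoomba": (-27.56, 151.95),
    "west creek": (-27.57, 151.95),
    "brisbane": (-27.47, 153.03),
}

def extract_locations(text, gazetteer=GAZETTEER, max_ngram=3):
    """Return {place name: coordinates} for names that appear in the text."""
    tokens = text.lower().replace(",", " ").split()
    found = {}
    for n in range(1, max_ngram + 1):
        for i in range(len(tokens) - n + 1):
            candidate = " ".join(tokens[i:i + n])
            if candidate in gazetteer:
                found[candidate] = gazetteer[candidate]
    return found

tweet = "Flash flooding at West Creek, Toowoomba CBD under water"
print(extract_locations(tweet))
# {'west creek': (-27.57, 151.95), 'toowoomba': (-27.56, 151.95)}
```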